Locally Adaptive Metric Nearest Neighbor Classification

Authors

  • Carlotta Domeniconi
  • Jing Peng
  • Dimitrios Gunopulos
Abstract

Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chi-squared distance analysis to compute a flexible metric for producing neighborhoods that are highly adaptive to query locations. Neighborhoods are elongated along less relevant feature dimensions and constricted along most influential ones. As a result, the class conditional probabilities tend to be smoother in the modified neighborhoods, whereby better classification performance can be achieved. The efficacy of our method is validated and compared against other techniques using a variety of simulated and real world data.
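The idea in the abstract can be sketched roughly as follows: in a coarse neighborhood of the query, score each feature by how strongly conditioning on it shifts the local class distribution (a chi-squared-style deviation), then weight the distance metric by those scores so the neighborhood constricts along relevant dimensions and elongates along irrelevant ones. This is a hypothetical simplification for illustration only, not the authors' implementation; the function name, parameters, and the particular relevance estimate are invented.

```python
import numpy as np

def adaptive_knn_predict(X, y, query, k=5, k0=50, eps=1e-12):
    """Toy sketch of locally adaptive metric k-NN (not the paper's exact method)."""
    n, d = X.shape
    # 1. Coarse neighborhood of the query under a plain Euclidean metric.
    idx0 = np.argsort(((X - query) ** 2).sum(axis=1))[:k0]
    X0, y0 = X[idx0], y[idx0]
    classes = np.unique(y)
    # Class frequencies P(c) inside the coarse neighborhood.
    p_c = np.array([(y0 == c).mean() for c in classes])
    # 2. Chi-squared-style relevance per feature: deviation between P(c)
    #    and P(c | feature j is close to the query's value).
    r = np.zeros(d)
    for j in range(d):
        close = np.abs(X0[:, j] - query[j]) <= np.std(X0[:, j]) + eps
        if close.sum() == 0:
            continue
        p_cj = np.array([(y0[close] == c).mean() for c in classes])
        r[j] = ((p_cj - p_c) ** 2 / (p_c + eps)).sum()
    # 3. Normalize to weights: large weight on influential features
    #    constricts the neighborhood along them, elongating it elsewhere.
    w = (r + eps) / (r + eps).sum()
    # 4. Weighted k-NN majority vote.
    d2 = ((X - query) ** 2 * w).sum(axis=1)
    nn = y[np.argsort(d2)[:k]]
    vals, counts = np.unique(nn, return_counts=True)
    return vals[np.argmax(counts)]

# Usage: feature 0 separates the classes, feature 1 is high-variance noise,
# so a plain Euclidean k-NN would be dominated by the irrelevant feature.
rng = np.random.default_rng(0)
n = 200
x0 = np.concatenate([rng.normal(-2, 0.5, n), rng.normal(2, 0.5, n)])
x1 = rng.normal(0, 5, 2 * n)
X = np.column_stack([x0, x1])
y = np.array([0] * n + [1] * n)
print(adaptive_knn_predict(X, y, np.array([2.0, 0.0])))
```

The local weighting is what lets the effective metric change from query to query, which is the adaptivity the abstract refers to.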


Similar articles

Adaptive Nearest Neighbor Classification Using Support Vector Machines

The nearest neighbor technique is a simple and appealing method to address classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of dimensionality. We propose a technique that computes a locally flexible metric by means of Support Vector Machines (S...

Full text

Adaptive Nearest Neighbor Classification using

The nearest neighbor technique is a simple and appealing method to address classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. The empl...

Full text

Adaptive Metric Nearest Neighbor Classification

Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chisqu...

Full text

Nearest Neighbor Classification with a Local Asymmetrically Weighted Metric

This paper introduces a new local asymmetric weighting scheme for the nearest neighbor classification algorithm. It is shown both with theoretical arguments and computer experiments that good compression rates can be achieved outperforming the accuracy of the standard nearest neighbor classification algorithm and obtaining almost the same accuracy as the k-NN algorithm with k optimised in each da...

Full text

BoostML: An Adaptive Metric Learning for Nearest Neighbor Classification

The nearest neighbor classification/regression technique, besides its simplicity, is one of the most widely applied and well studied techniques for pattern recognition in machine learning. A nearest neighbor classifier assumes class conditional probabilities to be locally smooth. This assumption is often invalid in high dimensions and significant bias can be introduced when using the nearest ne...

Full text



Publication date: 2002